# Text Summarization Optimization

## Qwen3 0.6B TLDR Lora

- License: Apache-2.0
- Tags: Text Generation
- Author: phh · Downloads: 56 · Likes: 0

Qwen3-0.6B is an open-source language model based on the Transformer architecture, with roughly 600 million parameters, suited to natural language processing tasks such as text summarization.
## Pal B Large Opt 350m

- License: MIT
- Tags: Text Generation, Transformers, English
- Author: daiweichen · Downloads: 37 · Likes: 1

A personalized reward model for diverse alignment, trained from facebook/opt-350m for text summarization tasks.
## Text Summarization Q4 K M GGUF

- License: Apache-2.0
- Tags: Text Generation, English
- Author: tonyc666 · Downloads: 16 · Likes: 0

A GGUF-format text summarization model converted from Falconsai/text_summarization, suitable for inference with llama.cpp.
## Summllama3.1 8B GGUF

- Tags: Large Language Model
- Author: tensorblock · Downloads: 52 · Likes: 0

An 8B-parameter summary generation model based on the Llama 3.1 architecture, offered in multiple quantization versions.
## Flan T5 Small Keywords

- License: MIT
- Tags: Large Language Model, Transformers, English
- Author: agentlans · Downloads: 1,101 · Likes: 5

A keyword extraction model fine-tuned from the small version of Flan-T5, specifically designed to extract keywords from paragraphs.
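The model above performs keyword extraction with a fine-tuned seq2seq model. As an illustration of the task itself (not of this model's method), here is a trivial frequency-based baseline in plain Python; the function name and the small stopword list are assumptions for the sketch.

```python
from collections import Counter
import re

# Minimal stopword list for the sketch; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are",
             "for", "on", "with", "that", "this", "it", "as", "be", "by"}

def keyword_baseline(text, top_k=5):
    """Frequency-based keyword baseline: lowercase the text, drop
    stopwords, and return the top_k most frequent remaining words."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(top_k)]
```

A learned model like Flan-T5 improves on this baseline by producing keywords that need not appear verbatim in the text.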
## Lamini Prompt Enchance Long

- Tags: Text Generation, Transformers
- Author: gokaygokay · Downloads: 1,028 · Likes: 21

A prompt enhancement model fine-tuned from LaMini-Flan-T5-248M, used to expand and refine text prompt descriptions.
## Paraphrase MiniLM L6 V2 Finetune Summary

- Tags: Text Embedding, Transformers
- Author: tonychenxyz · Downloads: 20 · Likes: 1

A sentence embedding model based on sentence-transformers that maps text to a 384-dimensional vector space, suitable for semantic search and text similarity computation.
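Similarity between the 384-dimensional vectors such a model produces is typically measured with cosine similarity. A minimal sketch in plain Python, assuming the embeddings have already been obtained from the model:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors:
    the dot product divided by the product of the vector norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

A score near 1.0 indicates semantically similar texts; orthogonal vectors score 0.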
## Bart Cause Effect

- Tags: Knowledge Graph, Transformers
- Author: taskload · Downloads: 275 · Likes: 3

A cause-effect extraction model developed by the Taskload team led by Henry Leonardi, designed for automated information extraction tasks.
## Distilbart Cnn 12 6

- License: Apache-2.0
- Tags: Text Generation, English
- Author: sshleifer · Downloads: 783.96k · Likes: 278

DistilBART is a distilled version of the BART model, optimized for text summarization; it significantly improves inference speed while retaining most of the original model's quality.
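BART-family summarizers accept only a limited input window (1024 tokens for BART), so long documents are usually split into chunks that are summarized separately. A hedged sketch of that preprocessing step, using word count as a rough proxy for the token limit; the function name and the 400-word default are assumptions:

```python
def chunk_text(text, max_words=400):
    """Split text into consecutive word-based chunks. Word count is
    only a rough proxy for the model's token limit; a production
    pipeline would count tokens with the model's own tokenizer."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]
```

Each chunk is then summarized independently, and the partial summaries can be concatenated or summarized again.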
## T5 Base Swedish

- License: Apache-2.0
- Tags: Large Language Model, Other
- Author: birgermoell · Downloads: 16 · Likes: 0

A Swedish text generation and translation model based on the T5 architecture, suitable for summarization and translation tasks.
## Pegasus Billsum

- Tags: Text Generation, Transformers, English
- Author: google · Downloads: 295 · Likes: 4

PEGASUS is an abstractive summarization model pre-trained with gap-sentence generation, focused on producing high-quality text summaries; this checkpoint targets the BillSum dataset.
## Pegasus Reddit Tifu

- Tags: Text Generation, Transformers, English
- Author: google · Downloads: 17 · Likes: 3

PEGASUS is a model pre-trained with gap-sentence generation, specifically designed for abstractive summarization; this checkpoint targets the Reddit TIFU dataset.
© 2025 AIbase